Two Convergence Results for Continuous Descent Methods

Authors

  • SIMEON REICH
  • ALEXANDER J. ZASLAVSKI
Abstract

We consider continuous descent methods for the minimization of convex functionals defined on a general Banach space. We establish two convergence results for methods generated by regular vector fields. Since the complement of the set of regular vector fields is σ-porous, we conclude that our results apply to most vector fields in the sense of Baire category.
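The prototypical continuous descent method is the gradient flow x′(t) = −∇f(x(t)) for a convex functional f. As a minimal sketch (not taken from the paper, and in finite dimensions rather than a general Banach space), the trajectory can be approximated by explicit Euler steps on a convex quadratic; the hypothetical function and parameter names below are illustrative only.

```python
import numpy as np

def gradient_flow(grad, x0, dt=0.01, steps=2000):
    """Explicit Euler discretization of the descent trajectory
    x'(t) = -grad(x(t)), starting from x0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - dt * grad(x)  # one Euler step along the negative gradient
    return x

# Convex quadratic f(x) = 0.5 * x @ Q @ x with Q symmetric positive definite,
# so the unique minimizer is x* = 0.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
grad_f = lambda x: Q @ x  # gradient of f

x_final = gradient_flow(grad_f, x0=[1.0, -1.0])
```

For this f the flow contracts toward the origin, so `x_final` ends up close to the minimizer; stability of the Euler scheme requires the step size `dt` to be small relative to the largest eigenvalue of Q.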


Related articles

Convergence Results for a Class of Abstract Continuous Descent Methods

We study continuous descent methods for the minimization of Lipschitzian functions defined on a general Banach space. We establish convergence theorems for those methods which are generated by approximate solutions to evolution equations governed by regular vector fields. Since the complement of the set of regular vector fields is σ-porous, we conclude that our results apply to most vector fiel...

Full text

A new Levenberg-Marquardt approach based on Conjugate gradient structure for solving absolute value equations

In this paper, we present a new approach for solving the absolute value equation (AVE) which uses the Levenberg-Marquardt method with a conjugate subgradient structure. In conjugate subgradient methods, the new direction is obtained by combining the steepest descent direction and the previous direction, which may not lead to good numerical results. Therefore, we replace the steepest descent dir...

Full text

Two Settings of the Dai-Liao Parameter Based on Modified Secant Equations

Following the setting of the Dai-Liao (DL) parameter in conjugate gradient (CG) methods, we introduce two new parameters based on the modified secant equation proposed by Li et al. (Comput. Optim. Appl. 202:523-539, 2007) with two approaches, which use an extended new conjugacy condition. The first is based on a modified descent three-term search direction, as the descent Hest...

Full text

On the convergence speed of artificial neural networks in the solving of linear systems

Artificial neural networks have advantages such as learning, adaptation, fault-tolerance, parallelism and generalization. This paper is a scrutiny on the application of diverse learning methods in speed of convergence in neural networks. For this aim, first we introduce a perceptron method based on artificial neural networks which has been applied for solving a non-singula...

Full text

A Generic Convergence Theorem for Continuous Descent Methods in Banach Spaces

We study continuous descent methods for minimizing convex functions defined on general Banach spaces and prove that most of them (in the sense of Baire category) converge.

Full text


Publication date: 2003